# Multi-domain Fine-tuning

## DiscoLM Mixtral 8x7b V2

**License:** Apache-2.0 · **Author:** DiscoResearch · **Downloads:** 205 · **Likes:** 124
**Tags:** Large Language Model · Transformers · English

An experimental 8x7b mixture-of-experts model built on Mistral AI's Mixtral 8x7b and fine-tuned on the Synthia, MetaMathQA, and Capybara datasets.
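As a minimal sketch of how a listing like this is typically used, the snippet below loads the model with the Hugging Face Transformers library. The repo id `DiscoResearch/DiscoLM-mixtral-8x7b-v2` is an assumption inferred from the author and model name above; verify it on the DiscoResearch organization page before running.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Assumed repo id, inferred from the listing; confirm on Hugging Face first.
model_id = "DiscoResearch/DiscoLM-mixtral-8x7b-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision; an 8x7b MoE is still very large even in fp16
    device_map="auto",          # shard layers across available GPUs
)

prompt = "Summarize mixture-of-experts routing in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```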
## Gpt2023

**License:** MIT · **Author:** crumb · **Downloads:** 136 · **Likes:** 18
**Tags:** Large Language Model · Transformers · English

A 124M-parameter language model based on the GPT-2 architecture, fine-tuned on 2.23B tokens of diverse data for improved text generation.
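Because this model is only 124M parameters, it runs comfortably on a CPU via the Transformers `pipeline` API. The repo id `crumb/gpt2023` is likewise an assumption inferred from the author and model name shown here.

```python
from transformers import pipeline

# Assumed repo id, inferred from the listing; verify on crumb's Hugging Face profile.
generator = pipeline("text-generation", model="crumb/gpt2023")

result = generator("The most notable development of 2023 was", max_new_tokens=40)
print(result[0]["generated_text"])
```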